Featured Article: Real-Time Deepfake Dating Scams

Written by: Paul

Here we look at how scammers are now reportedly using face-swapping technology to change their appearance in real time and conduct video-based romance scams.

Yahoo Boys 

Recently, tech news site ‘Wired’ featured a story about romance scammers dubbed ‘Yahoo Boys,’ a slang term for a Nigeria-based collective of scammers who are now using deepfakes and real-time face-swapping technology to present any appearance in the video feed shown to the targets of their romance scams. They are also known to be involved in phishing and other cybercrimes.

Romance Scams 

A romance scam is a type of fraud where someone creates a fake identity to form a relationship with their target, often online, to deceive them into sending money or revealing personal or financial information.

How Big Is the Problem?  

According to the US FBI’s 2023 ‘Internet Crime Report’, the ‘confidence fraud/romance’ category led to the theft of $652,544,805 from victims (actually down by a little over $83 million on the previous year). This is clearly a significant problem, and the real-time component will doubtless be a factor in making this type of fraud more prevalent.

How? What Tech Have They Been Using? 

David Maimon, Head of Fraud Insights at SentiLink and a professor at Georgia State University, has been monitoring the ‘Yahoo Boys’ on Telegram for more than four years. His research shows that they use phones, laptops, and several different types of popular face-swapping software and apps to create their deepfakes.

Wired has also noted that the so-called Yahoo Boys post videos of themselves online carrying out these scams, often showing their own faces, and that videos and photos of their activities and recruitment are posted across many popular social media channels, including TikTok and Facebook.

Professor Maimon has also noted that the Yahoo Boys started using deepfakes for their scams as far back as 2022, meaning that they have gained quite a lot of experience around using these tools and tactics. 

Deepfake Call Types 

It’s also been observed (and highlighted by Wired) that the Yahoo Boys use two different types of live deepfake calls to trick their targets:

The first method uses two phones and a face-swapping app. One phone is used to call the target (via Zoom), with its rear camera filming the screen of the second phone, which is pointed at the scammer’s face and running the face-swapping app. The face the target sees on the real-time video call is therefore completely different from the scammer’s real face.

The second method swaps the phones for a laptop, using a webcam and face-swapping software on the laptop to change the scammer’s face. It’s also been reported that videos made by the scammers of this method show that they are able to see their real face displayed alongside their deepfake face, although it’s only the deepfake face that’s shown to the target in the video call.

Realistic … and Getting Better

In a LinkedIn post from Professor Maimon, showing an example of one of the scammer’s videos, he notes how “Yahoo boys are getting better using AI tools to bring stolen images of social dating users to live” and that the video example he posted “has piqued my interest due to its remarkably natural head movements, overshadowing the only noticeable flaw—the voice, which could be rectified with relative ease.” 

How To Spot Deepfake (Video Calls) 

On her X feed, Rachel Tobac, who describes herself as a ‘Hacker & CEO at SocialProof Security,’ offers some tips on how to help spot a deepfake video call, based on the latest deepfake calls available.  These are: 

– Get the person to stick out their tongue and move it around (tongue will look odd).

– Have the person move their head to the right & left or up & down to a large degree (it will look angular and boxy).

– Ask the person to get close to the camera and turn their head through a wide angle (to see the angular, boxy side of the head).

– Ask the person to add another person next to them in the call and have the original person walk away and come back to see if a deepfake ‘flops-over’ to a second face. 

– Look for discoloration around the scalp or circumference of the face (it may look like unblended makeup). 

– Look for light flickering in their hair when they move. 

Meeting In Person

As noted by contributor ‘Ally A’ in response to the LinkedIn post about the Yahoo Boys from Matt Burgess of Wired, a key piece of advice to people who may be involved in these kinds of romantic video calls is: “You can’t trust your eyes and ears anymore. If you can’t meet the person you are talking to online IN PERSON within 2-3 weeks of meeting, you have to assume that they are a scammer.”

AI Advances Helping Scammers

The proliferation of AI technologies and their integration into various applications has inadvertently facilitated the activities of online scammers, including those involved in romance scams. AI-driven tools can now generate realistic and engaging text or images, enabling scammers to create convincing fake profiles and carry out sustained, personalised interactions without much effort – just as the Yahoo Boys have been doing. These sophisticated (but now widely available) tools can help scammers tailor their messages based on the victim’s preferences and responses, making the deceit more believable. As a result, the barrier to entry is lowered: even those with minimal technical skills can now execute complex and convincing scams, increasing the potential for exploitation and harm to unsuspecting individuals.

How To Protect Yourself 

In addition to Rachel Tobac’s tips for spotting deepfakes (such as those used by the Yahoo Boys), some of the key ways people can protect themselves from falling victim to romance scammers include:

– Verify profiles. Conduct reverse image searches of profile pictures to check if they appear elsewhere on the internet, which can indicate a stolen image. 

– Slow down. Be cautious with individuals who escalate the relationship too quickly or profess love unusually early! 

– Keep personal information private. Avoid sharing sensitive personal information such as your address, financial details, or social security number. 

– Be very skeptical of requests for money. Be highly suspicious if the person you are communicating with requests money, especially if it is for an emergency or a seemingly urgent matter. 

– Use secure communication channels. Stick to the platform’s messaging services and avoid switching to less secure or private communication methods too soon. 

– Seek second opinions. Discuss your online relationship with friends or family to gain outside perspectives, especially if something feels off. 

– Report suspicious behavior. Report any suspicious profiles or messages to the dating platform and consider filing a complaint with relevant authorities if you suspect a scam. 
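The reverse-image-search tip above works because matching services reduce each picture to a compact “fingerprint” and compare fingerprints rather than raw pixels. As a purely illustrative sketch (not how any real service is implemented), here is a minimal average-hash in Python, using small 2D grids of grayscale values as stand-ins for real image files:

```python
# Illustrative average-hash ("aHash") sketch. Real reverse image search
# services use far more sophisticated perceptual features; this only
# demonstrates the fingerprint-and-compare principle. The pixel grids
# below are made-up stand-ins for real images.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A "stolen" profile photo, a lightly re-compressed copy, and an
# unrelated image (all hypothetical values).
original = [[10, 200, 30], [220, 40, 250], [15, 210, 35]]
recompressed = [[12, 198, 28], [223, 41, 248], [14, 211, 33]]
unrelated = [[200, 10, 220], [30, 240, 20], [210, 25, 230]]

h_orig = average_hash(original)
print(hamming_distance(h_orig, average_hash(recompressed)))  # 0 -> likely the same image
print(hamming_distance(h_orig, average_hash(unrelated)))     # 9 -> a different image
```

A near-duplicate (such as a re-compressed copy of a stolen photo) produces a small Hamming distance, while an unrelated image produces a large one, which is why a stolen profile picture can often be traced back to its original source.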

What Does This Mean For Your Business?

For businesses, understanding the dynamics of the evolving scam landscape, as demonstrated by the techniques employed by the “Yahoo Boys”, is crucial. These scammers, using readily available AI technologies such as deepfakes and real-time face-swapping, underscore a growing trend in cybercrime that leverages cutting-edge technology to exploit vulnerabilities in human psychology, particularly through emotional engagement.

The decentralised nature of these scam networks (where individuals or small groups operate in loose associations while sharing tactics and tools) presents a significant challenge to traditional cybersecurity measures. These groups operate with a brazen openness, often flaunting their capabilities on social media, which shows a troubling confidence in their ability to evade detection.

The ease of access to AI tools means that the sophistication of scams can evolve as quickly as the technology develops. For businesses, this represents a clear and present danger not just in the form of romance scams targeted at individuals, but as a harbinger of more advanced AI-driven threats that could target companies directly. Phishing scams, impersonation, and business email compromise are just a few examples where similar technologies could be used to deceive employees or manipulate systems for fraudulent purposes. 

To safeguard against these threats, businesses need to enhance their defensive strategies by incorporating advanced detection systems that can identify anomalies in communication patterns, authenticate digital identities more robustly, and monitor for signs of emerging threats such as deepfakes. Training employees to recognise and report potential scams is also vital. Creating a culture of security awareness and providing tools to verify information independently can act as a crucial barrier against deception.